by Micah Watson, PhD
“Since I was a child I’ve always loved a good story. I believed that stories helped us to ennoble ourselves, to fix what was broken in us, and to help us become the people we dreamed of being.”
So begins Anthony Hopkins’s character, Robert Ford, in his speech marking the finale of the first season of HBO’s mind-bending, techno-philosophical series “Westworld.” Ford is the brilliant co-creator of Westworld, a theme park set several decades in the future in which wealthy customers can live out their fantasies, whatever they may be, with no apparent cost or consequence. The genius of Ford’s creation is not the theme park itself, though the sets and landscapes perfectly capture the nostalgic details of the vintage 1880s-era Western. What really sets Westworld apart are the “hosts” that populate the park: robots who inhabit various roles, inspire plots and are entirely indistinguishable from the all-too-human guests who pay upwards of $40,000 a day to interact with them. And the guests do interact as they please, some choosing heroic and noble roles to play and others indulging in baser appetites by killing, raping and abusing the non-human hosts who were created for that very purpose.
“A world without consequences and accountability reveals a depth of human depravity that would unsettle the most hardened Calvinist.”
Without giving away too much, suffice it to say that the nature of those interactions has shattered Ford’s dream. No, he still believes in the power of a good story, and the creators of “Westworld” wink at the audience here as they do throughout the series with allusions to Shakespeare, the Bible, Greek mythology and Kurt Vonnegut. But Ford no longer believes human beings can be ennobled through stories. Our brokenness seems permanent. Like the story of the ring of Gyges in the mouth of Plato’s Glaucon in The Republic, a world without consequences and accountability reveals a depth of human depravity that would unsettle the most hardened Calvinist. The first two seasons of “Westworld,” and presumably the ones yet to come, explore Ford’s attempt to rewrite the narrative of this new world he has helped to create.
Whatever one makes of the dramatic narrative, characters and, at times, maddening chronology of “Westworld,” one cannot fault its creators for being intellectually timid. This show is ambitious, asking its viewers to wrestle with the big questions: free will, God, morality, love, consciousness, eternity, personhood, family and life’s meaning. Fiction, particularly science fiction, offers us an alternative way to wrestle with these big questions at a sort of remove, or from a different angle. One crucial element for all these big-ticket questions, and for the development of the plot(s), is the line between subject and object, person and thing. So much of our everyday morality is wrapped up in this distinction that we can miss it, much like the proverbial fish who responds to the question “How’s the water today?” with “What’s water?”
When attuned to look for it, we find the importance of the distinction between people and things everywhere. It is there in the opening of the Hebrew scriptures, where everything is good, but human beings are somehow set apart as made in God’s image and stewards of everything else. We see it in Martin Buber’s distinction between an I-thou relationship and an I-it relationship. We see it in perhaps its purest philosophical form in one of Immanuel Kant’s articulations of the categorical imperative to “treat humanity . . . never merely as a means to an end, but always at the same time as an end.” In other words, don’t treat people merely as things to be used, but as people, beings who have value in and of themselves.
The notion is as simple as the application can be complex and controversial. We see it at work in the marketplace, where we must determine what, if anything, should not be commodified as something to be bought and sold. Slavery is the quintessential example, the (now) most obvious violation of the norm that people should not be treated as usable things. But the principle is there in several other areas as well. When we consider the moral validity of practices, goods or services, we ask ourselves whether the practice crosses this particular line. Most of the time it does not, but sometimes it does. Does creating new human beings through in vitro fertilization treat those future persons too much like products made in a lab or factory? Renting a room in one’s house seems like it does not touch upon the principle, but surrogate motherhood seems somewhat closer. We allow the market to regulate how we sell our labor, though we make certain exceptions. A few jurisdictions excepted, we do not sell sex. We do not sell kidneys. We do sell our time and our effort, but we are ambivalent if not suspicious about jobs that depend on us selling, and damaging, some part of ourselves that is intrinsic to who we are, whether we think of the pornography industry or professional sports that leave athletes mentally and physically disabled by middle age. The closer a practice comes to treating people merely as things that can be used up, the less comfortable we are. Or the less comfortable we should be.
Within the story itself, the basic appeal of Westworld as a theme park is the opportunity to escape that discomfort. Customers can take on another persona entirely, enjoy their side story and return home to normalcy after living vicariously through a version of themselves. An old tagline about Las Vegas comes to mind. Ford’s chief antagonist, Ed Harris’s Man in Black, tells us the tourists “wanted a place hidden from God. A place they could sin in peace.” This only works if the mistreated aren’t really ends in themselves but merely things to be used. What if those “things” woke up and turned out to have souls? That’s the basic appeal of “Westworld” for us as viewers; we get to see this hypothetical question played out.
“Westworld” certainly isn’t the first to draw from the fascinating possibility of inanimate objects coming to life. From Disney’s “Pinocchio” to a slew of previous AI-themed science fiction films, such as “Blade Runner,” “A.I.,” and “Ex Machina,” this genre provides a rich narrative vein to mine for stories about who we are, how we should treat one another and what, if anything, we are meant to become. One difference between “Westworld” and those stand-alone films is that HBO’s series has the luxury of addressing these questions over the course of twenty, thirty or even fifty hours instead of just two.
“Two familiar questions in particular stand out, and they both relate to this distinction between people and things. What is human nature? And, related, what is the nature of good and evil?”
There will undoubtedly be much ink and pixel devoted to interpreting and analyzing where “Westworld” goes in addressing the “big questions.” It may be premature at this point to speculate about the specific answers, if any, that “Westworld” will offer before it is finished. But two familiar questions in particular stand out, and they both relate to this distinction between people and things. What is human nature? And, related, what is the nature of good and evil?
Do the humans and the AI hosts in “Westworld” come prewired with a nature that they cannot help but follow? Is our consciousness merely an epiphenomenon, giving us the illusion of choice when in reality our lives are akin to trains that cannot jump the tracks? Developments in the second season suggest that while a great deal of the characters’ stories is provided to them by God, or at least by a mortal god (Robert Ford), some aspect of volition seems necessary to make sense of what a person (AI or human) is. Things by definition don’t really choose, don’t take a better or worse path. At most some things can calculate, but even then they do so according to pre-programmed instructions. The characters of “Westworld” go to great lengths to prove, often to themselves, that they have some measure of free will.
How they use that free will connects to the second theme of the nature of good and evil. “Westworld” frequently asks its characters whether they ever “question the nature of their reality.” Whether morality is somehow built into objective reality or is a social construct that we can evolve or program ourselves remains an open question in this story. Is the awful exploitation and sexual abuse perpetrated by the human guests on the hosts truly wrong? Would it be morally wrong for awakened hosts to viciously settle the score against their human overlords, quoting Shakespeare’s quip in “Romeo and Juliet” that “these violent delights have violent ends”? What would make such actions wrong?
The answer to that last question is inextricably tied up in what we make of the actors and the acted upon. Is the actor a moral being with dignity and volition? If yes, then its decisions are susceptible to moral judgment. Is the being acted upon a person and not a thing? If so, that fundamentally informs the moral framework by which we judge the actor’s decisions. The technological question of whether it will someday really be possible to achieve the “singularity” with artificial persons is less interesting than the moral question of how we should treat them if it does happen. This is because the issue of treating people like things is not at all hypothetical, and not at all science fiction. It is one of the fundamental questions about what it means to be human, and “Westworld” is only the latest story to pose that question to us. Whether its answers will ennoble us remains to be seen.
– – –
Micah Watson, Ph.D. is a native of the great, Golden State of California where he completed his undergraduate degree at the University of California, Davis. He earned his M.A. degree in church-state studies at Baylor University in Waco, Texas, and holds M.A. and doctorate degrees in politics from Princeton University. Professor Watson joined the faculty at Calvin College in the fall of 2015. He was also selected to serve as the William Spoelhof Teacher-Scholar Chair for the 2015-16 year.
Photo “Westworld” by John P. Johnson at facebook.com/WestworldHBO.